
# GPT2 Architecture Optimization

**Watashiha Gpt 6b** · Apache-2.0 · watashiha
A Japanese Oogiri (大喜利, comedic question-and-answer wordplay) language model built on the GPT-2 architecture, pre-trained and then fine-tuned on Oogiri data.
Tags: Large Language Model · Transformers · Japanese
1,831 · 7
**Gerpt2 Large** · MIT · benjamin
GerPT2 Large is the large version of the German GPT-2, trained on the CC-100 corpus and German Wikipedia, and performs strongly on German text-generation tasks.
Tags: Large Language Model · German
75 · 9
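Both models above follow standard GPT-2 configurations (GerPT2 Large uses the GPT-2 large layout). As a rough sanity check on these sizes, here is a minimal sketch that estimates the parameter count of a GPT-2-style decoder from its width and depth; the function name is illustrative, and it assumes the published GPT-2 defaults (vocabulary 50257, context 1024, 4x FFN expansion, tied input/output embedding).

```python
def gpt2_param_count(d_model: int, n_layers: int,
                     vocab_size: int = 50257, n_ctx: int = 1024) -> int:
    """Approximate parameter count of a GPT-2-style decoder.

    Assumes the standard GPT-2 layout: learned position embeddings,
    pre-LayerNorm blocks, 4x-expansion FFN, and an output head tied
    to the token embedding (so the head adds no extra parameters).
    """
    embed = vocab_size * d_model + n_ctx * d_model   # token + position embeddings
    attn = 4 * d_model**2 + 4 * d_model              # QKV projections + output projection
    mlp = 8 * d_model**2 + 5 * d_model               # up- and down-projection of 4x FFN
    norms = 4 * d_model                              # two LayerNorms per block
    block = attn + mlp + norms
    final_norm = 2 * d_model                         # LayerNorm after the last block
    return embed + n_layers * block + final_norm

# GPT-2 small (d=768, 12 layers) and GPT-2 large (d=1280, 36 layers),
# the configuration GerPT2 Large follows:
print(gpt2_param_count(768, 12))    # 124,439,808 (~124M)
print(gpt2_param_count(1280, 36))   # 774,030,080 (~774M)
```

The two results match the commonly quoted 124M and 774M figures for the GPT-2 small and large checkpoints, which is a quick way to confirm a model card's claimed size is consistent with its stated width and depth.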
© 2025 AIbase